    CUDA DSP Filter for ECG Signals

    Real-time processing is critical for the analysis of ECG signals. Before any processing, the signal needs to be filtered to enable feature extraction and further analysis. When building a data processing center that analyzes thousands of connected ECG sensors, the signal processing must be done very fast. In this paper, we focus on parallelizing a sequential DSP filter for processing heart signals on GPU cores. Our hypothesis is that the GPU version is much faster than the CPU version. We provide several experiments to test the validity of this hypothesis and to compare the performance of the parallelized GPU code with the sequential code. Assuming the hypothesis is valid, we also want to find the optimal number of threads per block that yields the maximum speedup. Our analysis shows that the parallelized GPU code achieves linear speedups and is much more efficient than classical single-processor sequential processing.
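
    The paper's code is not reproduced here; the following is only a minimal sketch of the kind of kernel the abstract describes, assuming a simple FIR filter and Numba's CUDA bindings, with each thread computing one output sample and threads_per_block as the tuning parameter mentioned above. The function names and the default block size are illustrative, not taken from the paper.

        # Hypothetical sketch: GPU FIR filtering of an ECG signal with Numba CUDA.
        import numpy as np
        from numba import cuda

        @cuda.jit
        def fir_filter_kernel(signal, coeffs, out):
            i = cuda.grid(1)                  # global index of the output sample
            if i < out.size:
                acc = 0.0
                for k in range(coeffs.size):  # accumulate over the filter taps
                    j = i - k
                    if j >= 0:
                        acc += coeffs[k] * signal[j]
                out[i] = acc

        def gpu_fir_filter(signal, coeffs, threads_per_block=256):
            d_signal = cuda.to_device(signal.astype(np.float32))
            d_coeffs = cuda.to_device(coeffs.astype(np.float32))
            d_out = cuda.device_array(signal.size, dtype=np.float32)
            blocks = (signal.size + threads_per_block - 1) // threads_per_block
            fir_filter_kernel[blocks, threads_per_block](d_signal, d_coeffs, d_out)
            return d_out.copy_to_host()

        # CPU reference for a speedup comparison:
        # np.convolve(signal, coeffs)[:signal.size]

    Varying threads_per_block (e.g. 64, 128, 256, 512) and timing both paths is one way to search for the optimal block size the authors investigate.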

    A mobile application for ECG detection and feature extraction

    This paper presents a system for early detection of, and alerting about, the onset of a heart attack. The system consists of a wireless, easily wearable, mobile ECG biosensor, a cloud-based data center, a smartphone, and a web application. A significant part of the system is the 24-hour health monitoring and care provided by expert cardiac physicians. The system predicts a potential heart attack and sends risk alerts to the medical experts for assessment. If a potential heart attack risk exists, an ambulance is called with the coordinates of the cardiac patient wearing the sensor. The timely reaction can prevent serious tissue damage or even the death of the system's users.
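
    As an illustration only (the paper publishes no code), the cloud-side alerting step described above can be reduced to a single decision: if the risk estimate produced by the upstream feature-extraction pipeline exceeds a threshold, notify the monitoring physicians. The data fields, threshold, and function names below are assumptions, not the system's actual API.

        # Hypothetical sketch of the cloud-side risk alerting described in the abstract.
        from dataclasses import dataclass

        @dataclass
        class EcgReading:
            patient_id: str
            risk_score: float   # produced upstream by feature extraction / prediction
            latitude: float
            longitude: float

        RISK_THRESHOLD = 0.8    # illustrative value only

        def notify_physician(reading: EcgReading) -> None:
            # Placeholder: in the described system the alert reaches expert cardiac
            # physicians, who assess it and may dispatch an ambulance to the
            # patient's coordinates.
            print(f"ALERT patient={reading.patient_id} risk={reading.risk_score:.2f} "
                  f"at ({reading.latitude}, {reading.longitude})")

        def handle_reading(reading: EcgReading) -> None:
            """Forward high-risk readings to the on-duty cardiologists for assessment."""
            if reading.risk_score >= RISK_THRESHOLD:
                notify_physician(reading)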

    E-Testing based on service oriented architecture

    The extensive use of technology in learning and work is also driving its use in the assessment process. Many software packages on the market provide automated assessment. Several of them are very comprehensive, but most are stand-alone applications without support for interoperability, adaptability to learner characteristics, or content reuse. In this paper, we describe the purpose and process of designing an interoperable E-Testing Framework by remodelling an existing E-Testing system and introducing a new, structured Service Oriented Architecture, based on encapsulating existing business functions as loosely coupled, reusable, platform-independent services that collectively realize the required business objectives. This common framework should support interoperable content, exchange of data and learner profiles, and enable search and retrieval of any data bank content in local and remote repositories.
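
    The framework's actual interfaces are not given in the abstract; the sketch below only illustrates what encapsulating business functions as loosely coupled, reusable services can look like, with the service names and methods chosen for illustration.

        # Hypothetical service contracts for an interoperable e-testing framework.
        from abc import ABC, abstractmethod

        class QuestionRepositoryService(ABC):
            """Search and retrieval of question-bank content, local or remote."""
            @abstractmethod
            def search(self, query: str) -> list[dict]: ...

        class LearnerProfileService(ABC):
            """Exchange of learner profiles for adaptation to learner characteristics."""
            @abstractmethod
            def get_profile(self, learner_id: str) -> dict: ...

        # A business process depends only on the contracts, so any implementation
        # (local module, SOAP/REST endpoint, remote repository) can be swapped in
        # without changing the caller.
        def assemble_adaptive_test(repo: QuestionRepositoryService,
                                   profiles: LearnerProfileService,
                                   learner_id: str, topic: str) -> list[dict]:
            level = profiles.get_profile(learner_id).get("level", "basic")
            return repo.search(f"{topic} difficulty:{level}")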

    Architecture for Distributed Simulation Environment

    The article describes a custom-made architecture for distributed simulation. Initially intended for the Skopje Bicycle Inter-modality simulation, it has proven to be universal, adaptable, and usable for many different purposes. It supports not only a set of simulation models, GUIs, and data representations, but also many users running simulations at the same time. Such an architecture brings certain challenges, and the authors discuss methods for dealing with them.
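
    The architecture itself cannot be reconstructed from the abstract; purely as an illustration of its central requirement, many users running simulations at the same time, independent runs can be dispatched to a worker pool as sketched below. Every name here is an assumption.

        # Hypothetical sketch: concurrent execution of per-user simulation runs.
        from concurrent.futures import ProcessPoolExecutor

        def run_simulation(user_id: str, scenario: dict) -> dict:
            # Placeholder for one simulation model run
            # (e.g. a bicycle inter-modality scenario).
            steps = scenario.get("steps", 1000)
            return {"user": user_id, "steps": steps, "status": "finished"}

        def dispatch(jobs: list[tuple[str, dict]], max_workers: int = 8) -> list[dict]:
            """Run independent user simulations concurrently and collect the results."""
            with ProcessPoolExecutor(max_workers=max_workers) as pool:
                futures = [pool.submit(run_simulation, uid, sc) for uid, sc in jobs]
                return [f.result() for f in futures]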

    E-Deposit in Academic Use

    The e-Deposit is a deposit that can be managed in electronic form. The concept of a fund accessible for different financial transactions makes the e-Deposit appropriate for use at universities. The application is implemented in a three-tier architecture. Because financial data is exchanged extensively over the internet, data integrity is secured. The business and data-access tiers are implemented in a modular manner, which improves the robustness of the application and reduces the risk of unwanted behavior. Special modules enable the integration of an e-commerce scenario into an existing university information system.
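
    The abstract does not say how integrity is enforced; one common way to secure financial data exchanged between tiers over the internet is to attach a keyed hash to every transaction, sketched below. The key, field names, and helper functions are assumptions, not the application's code.

        # Hypothetical sketch: HMAC-based integrity check between the business and
        # data-access tiers.
        import hashlib
        import hmac
        import json

        SECRET_KEY = b"shared-secret-between-tiers"   # illustrative only

        def sign_transaction(transaction: dict) -> dict:
            """Business tier: attach an integrity tag before sending the transaction on."""
            payload = json.dumps(transaction, sort_keys=True).encode()
            tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
            return {"payload": transaction, "tag": tag}

        def verify_transaction(message: dict) -> bool:
            """Data-access tier: reject any transaction altered in transit."""
            payload = json.dumps(message["payload"], sort_keys=True).encode()
            expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, message["tag"])

        msg = sign_transaction({"student_id": "12345", "amount": 40.0, "service": "exam fee"})
        assert verify_transaction(msg)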

    Comparison of 24 h ECG Holter Monitoring with Real-time Long-term ECG Monitoring System using ECGalert Software and Savvy Single-Lead Patch

    AIM: The aim of the study was to show non-inferiority of the single-channel ECGalert system to the gold standard (ECG Holter) in the detection of arrhythmias over the total wear time of both devices. METHODS: A prospective study enrolled a total of 165 patients hospitalized at the University Clinic of Cardiology, who underwent simultaneous single-channel ECG recording with the ECGalert system and a conventional 24 h Holter monitor on the 1st day and continued ECGalert monitoring for a few more days, as assigned by the doctor or at the patient's request. RESULTS: A total of 165 patients were included in the study, 61.2% male, with a mean age of 58.4 ± 12.7 years. The mean duration of ECG Holter monitoring was 23.2 ± 0.5 h and the mean duration of ECGalert/Savvy monitoring was 64.6 ± 31.2 h. During the first 24 h of simultaneous ECG monitoring with both methods, no statistically significant difference was found in arrhythmia detection. Over the total wear time of both devices, the ECGalert system detected significantly more AF episodes compared to Holter (p < 0.001). ECGalert demonstrated a significantly lower detection rate of false pauses (p = 0.001). However, false detection of episodes of VT or AF was significantly higher in the ECGalert system than with Holter (p < 0.001 for both). Patients were more satisfied with the ECGalert system, due to less interference with daily activities. CONCLUSION: The ECGalert system demonstrated superiority over traditional Holter monitoring in arrhythmia detection over the total monitoring period, but not in the first 24 h.

    A New Model for Semiautomatic Student Source Code Assessment

    Programming courses at the university and high-school level, and competitions in informatics (programming), often require fast assessment of the submitted solutions to programming tasks. This problem is usually solved by automated systems that check each solution's output on a set of test cases. In this paper, we present a new model for semiautomatic assessment of student source code for a given programming task, based on our approach of representing program codes as vectors. It represents a collaborative human-computer effort. Our research on using these vectors in data-mining analysis of the source codes, which achieved excellent results in the number of correctly clustered items, is a solid foundation for the proposed model. Finally, we present the results of preliminary testing of the model.
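
    The authors' exact vector representation is not given in the abstract; the sketch below only illustrates the general idea of turning each submitted solution into a token-count vector and clustering the vectors so that a human assessor reviews one representative per cluster. The tokenizer and the choice of k-means are assumptions.

        # Hypothetical sketch: source codes as vectors, then clustered.
        from sklearn.cluster import KMeans
        from sklearn.feature_extraction.text import CountVectorizer

        solutions = [
            "for i in range(n): total += a[i]",
            "total = sum(a)",
            "while i < n: total += a[i]; i += 1",
        ]

        # Represent each submitted solution as a vector of token counts.
        vectorizer = CountVectorizer(token_pattern=r"[A-Za-z_]+|\S")
        vectors = vectorizer.fit_transform(solutions)

        # Group similar solutions; a human then assesses one representative per cluster.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
        print(labels)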